15 research outputs found

    Belief merging within fragments of propositional logic

    Recently, belief change within the framework of fragments of propositional logic has gained increasing attention. Previous works focused on belief contraction and belief revision on the Horn fragment. However, the problem of belief merging within fragments of propositional logic has been neglected so far. This paper presents a general approach to define new merging operators derived from existing ones such that the result of merging remains in the fragment under consideration. Our approach is not limited to the case of the Horn fragment but applicable to any fragment of propositional logic characterized by a closure property on the sets of models of its formulae. We study the logical properties of the proposed operators in terms of satisfaction of merging postulates, considering in particular distance-based merging operators for Horn and Krom fragments. To appear in the Proceedings of the 15th International Workshop on Non-Monotonic Reasoning (NMR 2014).
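    The closure property mentioned in the abstract can be made concrete for the Horn fragment: a set of models is expressible by a Horn formula exactly when it is closed under intersection (componentwise AND) of models. The following sketch illustrates this characterisation with a brute-force model enumerator; the clause encoding and helper names are our own illustration, not taken from the paper.

    ```python
    from itertools import product

    def models(clauses, n):
        """Enumerate all assignments (tuples of 0/1) satisfying a CNF.

        Each clause is a list of non-zero ints: +i means variable i,
        -i means its negation (variables are numbered 1..n).
        """
        return [a for a in product((0, 1), repeat=n)
                if all(any((lit > 0) == bool(a[abs(lit) - 1]) for lit in clause)
                       for clause in clauses)]

    def closed_under_intersection(model_set):
        """The closure property characterising the Horn fragment: the
        componentwise AND (model intersection) of any two models is a model."""
        ms = set(model_set)
        return all(tuple(a & b for a, b in zip(m1, m2)) in ms
                   for m1 in ms for m2 in ms)

    # Horn formula (at most one positive literal per clause): (~x1 v x2) & (~x2 v x3)
    horn = [[-1, 2], [-2, 3]]
    print(closed_under_intersection(models(horn, 3)))      # True

    # Non-Horn formula: (x1 v x2) has models (1,0) and (0,1), but their
    # intersection (0,0) is not a model
    non_horn = [[1, 2]]
    print(closed_under_intersection(models(non_horn, 2)))  # False
    ```

    A fragment-preserving merging operator, in these terms, must return a result whose model set satisfies this closure test.
    
    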

    The parameterized complexity of positional games

    We study the parameterized complexity of several positional games. Our main result is that Short Generalized Hex is W[1]-complete parameterized by the number of moves. This solves an open problem from Downey and Fellows’ influential list of open problems from 1999. Previously, the problem was thought of as a natural candidate for AW[*]-completeness. Our main tool is a new fragment of first-order logic where universally quantified variables only occur in inequalities. We show that model-checking on arbitrary relational structures for a formula in this fragment is W[1]-complete when parameterized by formula size. We also consider a general framework where a positional game is represented as a hypergraph and two players alternately pick vertices. In a Maker-Maker game, the first player to have picked all the vertices of some hyperedge wins the game. In a Maker-Breaker game, the first player wins if she picks all the vertices of some hyperedge, and the second player wins otherwise. In an Enforcer-Avoider game, the first player wins if the second player picks all the vertices of some hyperedge, and the second player wins otherwise. Short Maker-Maker, Short Maker-Breaker, and Short Enforcer-Avoider are respectively AW[*]-, W[1]-, and co-W[1]-complete parameterized by the number of moves. This suggests a rough parameterized complexity categorization into positional games that are complete for the first level of the W-hierarchy when the winning condition only depends on which vertices one player has been able to pick, but AW[*]-complete when it depends on which vertices both players have picked. However, some positional games with highly structured board and winning configurations are fixed-parameter tractable. We give another example of such a game, Short k-Connect, which is fixed-parameter tractable when parameterized by the number of moves
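    The hypergraph framework in the abstract can be illustrated directly. Below is a minimal brute-force solver (our own sketch, with hypothetical function names; it is not the W[1]-algorithm from the paper, just a definition of the Short Maker-Breaker winning condition via minimax) that decides whether Maker, moving first, can claim all vertices of some hyperedge within a given number of her own moves.

    ```python
    def maker_wins(hyperedges, vertices, moves):
        """Decide Short Maker-Breaker by plain minimax: can Maker (moving first)
        claim every vertex of some hyperedge within `moves` of her own moves?
        Exponential search -- meant only to pin down the game's semantics."""
        hyperedges = [frozenset(e) for e in hyperedges]

        def play(maker, free, left, makers_turn):
            if any(e <= maker for e in hyperedges):
                return True            # Maker has completed a hyperedge
            if left == 0 or not free:
                return False           # move budget exhausted
            if makers_turn:
                return any(play(maker | {v}, free - {v}, left - 1, False)
                           for v in free)
            # Breaker only removes vertices from play; she wins by blocking
            return all(play(maker, free - {v}, left, True) for v in free)

        return play(frozenset(), frozenset(vertices), moves, True)

    # Two overlapping edges sharing vertex 2: Maker takes 2, then whichever
    # of 1 or 3 Breaker failed to block -- a win in two of her moves.
    print(maker_wins([{1, 2}, {2, 3}], {1, 2, 3, 4}, 2))  # True
    ```

    The Maker-Maker and Enforcer-Avoider variants from the abstract differ only in the winning condition checked at the top of `play`, which is precisely why the paper's completeness results split along which players' picks the condition inspects.
    
    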

    Disseminated cancer cells detected by immunocytology in lymph nodes of NSCLC patients are highly prognostic and undergo parallel molecular evolution

    In melanoma, immunocytology (IC) after sentinel lymph node disaggregation not only enables better quantification of disseminated cancer cells (DCCs) than routine histopathology (HP) but also provides a unique opportunity to detect, isolate, and analyse these earliest harbingers of metachronous metastasis. Here, we explored lymph node IC in non-small cell lung cancer (NSCLC). For 122 NSCLC patients, 220 lymph nodes (LNs) were split in half and prepared for IC and HP. When both methods were compared, IC identified 22% positive patients as opposed to 4.5% by HP, revealing a much higher sensitivity of IC (p < 0.001). Assessment of all available 2,952 LNs of the same patients by HP uncovered additional patients escaping detection of lymphatic tumour spread by IC alone, consistent with the concept of skip metastasis. A combined lymph node status of IC and complete HP on a larger cohort of patients outperformed all risk factors in multivariable analysis for prognosis (p < 0.001; RR = 2.290; CI 1.407–3.728). Moreover, isolation of DCCs and single-cell molecular characterization revealed that (1) LN-DCCs differ from primary tumours in terms of copy number alterations and selected mutations and (2) critical alterations are acquired during colony formation within LNs. We conclude that LN-IC in NSCLC patients when combined with HP improves diagnostic precision, has the potential to reduce total workload, and facilitates molecular characterization of lymphatically spread cancer cells, which may become key for the selection and development of novel systemic therapies. © 2022 The Authors. The Journal of Pathology published by John Wiley & Sons Ltd on behalf of The Pathological Society of Great Britain and Ireland

    Efficient counting with bounded treewidth using datalog

    Bounded treewidth has proven to be a key concept in identifying tractable fragments of inherently intractable problems. An important result in this context is Courcelle's Theorem, stating that any property of finite structures definable in monadic second-order logic (MSO) becomes tractable if the treewidth of the structure is bounded by a constant. An extension of this result to counting problems was given by Arnborg et al., but neither proof yields an implementable algorithm. Recently, Gottlob et al. presented a new approach using monadic datalog to close this gap for decision problems. The goal of this work is to extend this method to handle counting problems as well. We show that the monadic datalog approach is indeed applicable to all MSO-definable counting problems. Furthermore, we propose concrete algorithms with fixed-parameter linear running time for the problems #SAT, #CIRCUMSCRIPTION, and #HORN-ABDUCTION.
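    For reference, the counting problem #SAT that the abstract targets asks for the number of satisfying assignments of a propositional formula. The sketch below defines it by naive 2^n enumeration (our own illustration; the thesis's treewidth-based datalog algorithms instead achieve fixed-parameter linear running time):

    ```python
    from itertools import product

    def count_models(clauses, n):
        """#SAT by exhaustive enumeration: count the satisfying assignments
        of a CNF with clauses as lists of signed ints over variables 1..n."""
        count = 0
        for a in product((False, True), repeat=n):
            if all(any(a[abs(lit) - 1] == (lit > 0) for lit in clause)
                   for clause in clauses):
                count += 1
        return count

    # (x1 v x2) & (~x1 v x3): two models with x1 false, two with x1 true
    print(count_models([[1, 2], [-1, 3]], 3))  # 4
    ```
    
    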

    The parameterized complexity of nonmonotonic reasoning

    In our daily life we encounter situations where conclusions already drawn turn out to be invalid because new information becomes available that we did not know before. Human reasoning is not only capable of dealing with such nonmonotonic situations; this kind of reasoning is in fact performed by humans all the time. Hence, it is no surprise that much research in artificial intelligence (AI) and knowledge representation (KR) is devoted to so-called nonmonotonic reasoning formalisms. Examples of such formalisms are belief revision, answer-set programming, and (propositional) abduction. A big obstacle limiting the practical applicability of computational nonmonotonic reasoning is the high complexity of these tasks. A promising approach to dealing with high computational complexity is to study which properties allow problem instances to be solved efficiently. Such tractable fragments are not restricted to syntactic limitations, such as Horn formulas instead of general propositional formulas. 
    We are more interested in structural fragments, since they often tell us something about the (hidden) structure of certain problem instances. The framework within which the search for such structural fragments is conducted is called parameterized complexity theory. There, a multivariate complexity analysis of the problem is performed, in which the input size is just one dimension; the other dimensions studied are called parameters. To overcome the high complexity of nonmonotonic reasoning problems, we seek parameters, or combinations of parameters, in which the computational hardness of the problem can be confined. This means that if we consider fragments of problem instances with sufficiently small parameter values, we obtain new tractable fragments. In this thesis we study the three nonmonotonic reasoning formalisms mentioned above. We initiate the research on the parameterized complexity of belief revision as well as of propositional abduction. Furthermore, we significantly advance the state of the art in the parameterized complexity of answer-set programming.
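    Of the three formalisms named in the abstract, propositional abduction is the easiest to state concretely: given a background theory T, a set of hypotheses H, and an observed manifestation m, find the subsets S of H such that T together with S is consistent and entails m. The brute-force sketch below (our own illustration with hypothetical helper names, not an algorithm from the thesis) makes both conditions explicit via a naive SAT check:

    ```python
    from itertools import product, combinations

    def satisfiable(clauses, n):
        """Naive SAT check by enumeration (literals +i / -i over variables 1..n)."""
        return any(all(any(a[abs(l) - 1] == (l > 0) for l in c) for c in clauses)
                   for a in product((False, True), repeat=n))

    def explanations(theory, hypotheses, manifestation, n):
        """Brute-force propositional abduction: every S <= hypotheses such that
        theory + S is consistent and theory + S entails the manifestation."""
        sols = []
        for r in range(len(hypotheses) + 1):
            for s in combinations(hypotheses, r):
                t = theory + [[h] for h in s]
                consistent = satisfiable(t, n)
                entails = not satisfiable(t + [[-manifestation]], n)
                if consistent and entails:
                    sols.append(set(s))
        return sols

    # Theory: rain -> wet (1 = rain, 2 = wet); hypothesis: rain; observed: wet
    print(explanations([[-1, 2]], [1], 2, 2))  # [{1}]
    ```

    The 2^|H| subset loop is exactly the kind of combinatorial explosion that a parameterized analysis tries to confine to a parameter, such as the number of hypotheses.
    
    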

    Belief Merging within Fragments of Propositional Logic

    International audience